uniform design
Gliding over the Pareto Front with Uniform Designs
Multiobjective optimization (MOO) plays a critical role in many real-world domains. A major challenge is generating K uniformly distributed Pareto-optimal solutions to represent the entire Pareto front. To address this issue, this paper first introduces the \emph{fill distance} to evaluate a set of K design points, providing a quantitative metric for the representativeness of the design. However, directly computing the optimal design that minimizes the fill distance is nearly intractable, as it requires solving a nested \min-\max-\min optimization problem. We therefore propose a surrogate ``max-packing'' design for the fill-distance design, which is easier to optimize and yields a rate-optimal design whose fill distance is at most 4 times the minimum value.
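To make the fill-distance criterion concrete, the sketch below computes the fill distance of K design points against a densely discretized Pareto front: the largest distance from any front point to its nearest design point (the outer max-min of the nested problem above). This is a toy illustration under our own assumptions, not the paper's optimization procedure, and the function names are ours:

```python
import numpy as np

def fill_distance(front, design):
    """Fill distance of a design w.r.t. a discretized Pareto front:
    the largest distance from any front point to its nearest design point."""
    # pairwise distances: (n_front, n_design) via broadcasting
    d = np.linalg.norm(front[:, None, :] - design[None, :, :], axis=-1)
    return d.min(axis=1).max()

# toy example: a linear 2-objective front, with a 3-point design on it
front = np.stack([np.linspace(0, 1, 201), 1 - np.linspace(0, 1, 201)], axis=1)
design = front[[0, 100, 200]]  # the two endpoints and the midpoint
fd = fill_distance(front, design)  # = sqrt(2)/4 for this symmetric front
```

A max-packing design would instead spread the K points to maximize their pairwise separation, which the paper shows bounds this quantity within a factor of 4 of the optimum.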
Model-free Subsampling Method Based on Uniform Designs
Mei Zhang, Yongdao Zhou, Zheng Zhou, Aijun Zhang
Subsampling, or subdata selection, is a useful approach in large-scale statistical learning. Most existing studies focus on model-based subsampling methods, which depend heavily on the model assumption. In this paper, we consider a model-free subsampling strategy for generating subdata from the original full data. To measure how well a subdata set represents the original data, we propose a criterion, the generalized empirical F-discrepancy (GEFD), and study its theoretical properties in connection with the classical generalized L2-discrepancy from the theory of uniform designs. These properties allow us to develop a low-GEFD, data-driven subsampling method based on existing uniform designs. Through simulation examples and a real case study, we show that the proposed subsampling method is superior to random sampling. Moreover, our method remains robust under diverse model specifications, whereas other popular subsampling methods underperform. In practice, this model-free property is more appealing than model-based subsampling, which may perform poorly when the model is misspecified, as demonstrated in our simulation studies.
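The following is a loose sketch of the model-free idea, under our own assumptions rather than the paper's algorithm: map the full data to the unit cube via empirical ranks, then, for each point of a low-discrepancy set (here a simple 2-D Fibonacci lattice standing in for a proper uniform design), select the nearest data row. The GEFD criterion itself is not reproduced here, and all function names are ours:

```python
import numpy as np

def rank_transform(X):
    """Map each column of X to (0, 1) via its empirical ranks."""
    n = X.shape[0]
    return (np.argsort(np.argsort(X, axis=0), axis=0) + 0.5) / n

def subsample(X, k):
    """Pick up to k rows of 2-D data X whose rank-transformed positions
    lie closest to a Fibonacci lattice (a stand-in uniform design)."""
    phi = (np.sqrt(5) - 1) / 2
    targets = np.stack([(np.arange(k) + 0.5) / k,
                        (np.arange(k) * phi) % 1.0], axis=1)
    U = rank_transform(X)
    idx = [int(np.argmin(np.linalg.norm(U - t, axis=1))) for t in targets]
    return np.unique(idx)  # duplicates possible; deduplicate

# usage: draw a subsample of 10 rows from 200 synthetic observations
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
idx = subsample(X, 10)
```

The rank transform makes the selection invariant to monotone rescaling of each variable, which is one reason such design-based subsamples can stay robust across model specifications.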
Hyperparameter Optimization via Sequential Uniform Designs
Hyperparameter tuning, or optimization, plays a central role in the automated machine learning (AutoML) pipeline. It is a challenging task, as the response surfaces of hyperparameters are generally unknown and each evaluation is expensive. In this paper, we reformulate hyperparameter optimization as a computer experiment and propose a novel sequential uniform design (SeqUD) approach. It is advantageous because a) it adaptively explores the hyperparameter space with evenly spread design points, avoiding the expensive meta-modeling and acquisition-optimization procedures of Bayesian optimization; b) design points are generated in batches, which can be easily parallelized; and c) a real-time augmented uniform design (AugUD) algorithm is developed for the efficient generation of new design points. Experiments are conducted on both global optimization tasks and hyperparameter optimization applications. The results show that SeqUD outperforms related hyperparameter optimization methods, demonstrating it to be a promising and competitive alternative to existing tools.
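The sequential zooming idea can be sketched as follows: evaluate an evenly spread batch of points in the current search box, then shrink the box around the incumbent best and repeat. This is a simplified sketch under our own assumptions; a plain midpoint grid stands in for the paper's AugUD designs, and the function name is ours:

```python
import numpy as np
from itertools import product

def sequd_minimize(f, lo, hi, levels=3, rounds=6):
    """SeqUD-style zooming search (sketch): evaluate an evenly spread
    batch in the current box, halve the box around the best point, repeat."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    best_x, best_y = None, np.inf
    for _ in range(rounds):
        marks = (np.arange(levels) + 0.5) / levels            # evenly spread levels
        pts = np.array([lo + (hi - lo) * np.array(c)
                        for c in product(marks, repeat=lo.size)])
        ys = np.array([f(p) for p in pts])                    # batch: parallelizable
        if ys.min() < best_y:
            best_y, best_x = ys.min(), pts[ys.argmin()]
        half = (hi - lo) / 4                                  # next box: half the width,
        lo, hi = best_x - half, best_x + half                 # centered at the incumbent
    return best_x, best_y

# toy usage: minimize a quadratic over [0, 1]^2
x, y = sequd_minimize(lambda p: np.sum((p - 0.3) ** 2), [0, 0], [1, 1])
```

Because every batch is fixed before any evaluation, all points in a round can be evaluated in parallel, which is the practical advantage the abstract claims over sequential acquisition-based methods.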